
    Creep and locking of a low-angle normal fault: Insights from the Altotiberina fault in the Northern Apennines (Italy)

    While low-angle normal faults have been recognized worldwide in geological studies, whether these structures are active and capable of generating large earthquakes is still debated. We provide new constraints on the role and modes of the Altotiberina fault (ATF) in accommodating extension in the Northern Apennines. We model GPS velocities to study block kinematics, fault slip rates and interseismic coupling of the ATF, which is active and, together with its antithetic fault, accounts for a large part of the observed ~3 mm/yr chain-normal tectonic extension. A wide portion of the ATF creeps at the long-term slip rate (1.7 ± 0.3 mm/yr), but the shallow locked portions are compatible with M > 6.5 earthquakes. We suggest that the positive stress accumulation due to ATF creep is most likely released by more favorably oriented splay faults, whose ruptures may propagate downdip along the low-angle normal fault surface and reduce the probability of a seismic rupture of the shallower locked portion.
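The link between a locked patch, the long-term slip rate and the magnitude it can sustain can be sketched with a back-of-the-envelope moment-deficit calculation. Only the 1.7 mm/yr slip rate comes from the abstract; the patch size, rigidity and full coupling below are illustrative assumptions.

```python
import math

def moment_rate(mu_pa, area_m2, coupling, slip_rate_m_per_yr):
    """Seismic moment deficit accumulation rate, N*m per year."""
    return mu_pa * area_m2 * coupling * slip_rate_m_per_yr

def mw_from_m0(m0_nm):
    """Moment magnitude from seismic moment (Hanks & Kanamori, 1979)."""
    return (2.0 / 3.0) * (math.log10(m0_nm) - 9.05)

# Illustrative values: a fully coupled 20 km x 10 km locked patch, rigidity
# 30 GPa, and the 1.7 mm/yr long-term slip rate quoted in the abstract.
rate = moment_rate(30e9, 20e3 * 10e3, 1.0, 1.7e-3)
m0_target = 10 ** (1.5 * 6.5 + 9.05)   # moment of an Mw 6.5 event
years = m0_target / rate               # time to accumulate that deficit
```

Under these assumed numbers the deficit of an Mw 6.5 event accumulates over several centuries, which is the order of magnitude that makes the creep-versus-locking distinction consequential.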

    A Preliminary Study on Flexible Temperature Sensors for Eskin Medical Devices

    In recent years, the need for a renewed healthcare paradigm has pushed research towards remote diagnosis, care and monitoring of physiological parameters. Wearable and e-skin devices have thus emerged as candidates for integration into standard medical equipment. In this work, a preliminary study on flexible AJP-printed temperature sensors is reported, proposing a novel approach to evaluate infection sites, monitor body temperature and compensate for the effects of temperature on other on-body sensors. Two different geometries are proposed, designed, produced, evaluated and compared. The results showed a similar dependence on temperature (average TCR = 2.5 · 10⁻³ °C⁻¹), and the dependence on substrate deformation was investigated, as well as the geometrical features of the sensors.
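The reported TCR maps onto the usual linear resistance-temperature model; a minimal sketch, where the 100-ohm nominal resistance and 25 °C reference are hypothetical values, not taken from the study:

```python
def resistance(r0_ohm, tcr_per_degc, t_degc, t_ref_degc=25.0):
    """Linear model R(T) = R0 * (1 + TCR * (T - Tref))."""
    return r0_ohm * (1.0 + tcr_per_degc * (t_degc - t_ref_degc))

# With the average TCR reported above (2.5e-3 per degC) and a hypothetical
# 100-ohm printed trace, a 10 degC rise changes the resistance by 2.5 %:
r_hot = resistance(100.0, 2.5e-3, 35.0)   # 102.5 ohm
```

Inverting the same relation is how such a sensor would read temperature back from a measured resistance.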

    Interference of tectonic signals in subsurface hydrologic monitoring through gravity and GPS due to mountain building

    Global Positioning System observations in the Alps now have sufficient precision to reliably observe vertical surface movement rates at the orogen scale. The geodynamic modeling of converging plate margins requires constraints on the origin of orogenic uplift, whose two end members are pure crustal uplift and crustal thickening. Gravity change rates, combined with uplift measurements, allow the two mechanisms to be distinguished. We use vertical uplift rates over the Alpine range and the southern foreland basin to predict the gravity change under different geodynamic hypotheses: pure uplift with mantle inflow, or crustal thickening with isostatic Moho lowering. The sensitivity of gravity as a tool to distinguish the two mechanisms is investigated. This model differs from predicted isostatic movements based on glacial history and mantle viscosity, since here the uplift is measured rather than predicted. The estimate of this tectonic signal is important when gravity change rates, such as those obtained from GRACE, are interpreted exclusively in terms of hydrologic changes tied to climatic variation. It has already been shown that in some areas, such as the Tibetan Plateau and the Himalayas, the tectonic signal is not negligible. Here we estimate the effect of the tectonic signal for the uplift of smaller mountain ranges, such as the Alpine arc. Our results indicate that tectonic and hydrological signals superpose, and the tectonic signal cannot be ignored when using GRACE to invert for equivalent water height (EWH).
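The two end members differ in their first-order gravity signature: pure uplift moves the station up through the free-air gradient only, while crustal thickening also adds new crustal mass beneath it. A minimal sketch, using the standard free-air gradient and infinite-slab approximation and neglecting the isostatic root; the crustal density is an assumed value.

```python
import math

FREE_AIR = -0.3086          # microGal per mm of elevation gain
G = 6.674e-11               # gravitational constant, m^3 kg^-1 s^-2
RHO_CRUST = 2670.0          # assumed crustal density, kg/m^3

def bouguer_plate(thickness_m, rho=RHO_CRUST):
    """Attraction of an infinite slab of given thickness, in microGal."""
    return 2.0 * math.pi * G * rho * thickness_m * 1e8   # m/s^2 -> microGal

def dg_pure_uplift(uplift_mm_per_yr):
    """Mantle inflow: the surface rises but no crustal mass is added."""
    return FREE_AIR * uplift_mm_per_yr

def dg_thickening(uplift_mm_per_yr):
    """Crustal thickening: uplift plus an equivalent slab of new crust."""
    return FREE_AIR * uplift_mm_per_yr + bouguer_plate(uplift_mm_per_yr * 1e-3)
```

Per mm/yr of uplift the two predictions differ by roughly 0.11 microGal/yr, which is the discriminating signal the sensitivity analysis is after.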

    Analysis of High-Rate GPS Data Collected During the L’Aquila Seismic Sequence

    Four days before the 6 April M5.8 L’Aquila main shock, a few GPS receivers recording at 10 Hz and 1 Hz sampling rates had been set up by INGV in the area affected by the seismic swarm ongoing since mid-January 2009. These data allowed us to measure, for the first time in Italy with GPS instruments, the dynamic co-seismic displacements with periods ranging from fractions of a second to several minutes, and the full time spectra of the surface co-seismic and early post-seismic deformation. We use TRACK, the kinematic module of the GAMIT/GLOBK software package, to perform epoch-by-epoch solutions of raw GPS data and obtain 3D time series of surface displacements. TRACK uses floating-point LC (L3) observations between pairs of stations and the Melbourne-Wübbena wide-lane combination, with ionospheric constraints, to determine integer ambiguities at each epoch. It requires a fixed station and one or more kinematic stations. Usually, the static station is chosen far enough from the epicentral area not to be affected by the co-seismic displacements. Since no automatic processing engine exists for TRACK, we built a new shell script that takes full advantage of the Linux CPU cluster routinely used to analyze 30-second GPS data with GAMIT at INGV-Bologna. The new tool allows us to automatically process pairs of stations (i.e., a network) and obtain raw time series of several stations simultaneously (depending on the number of cluster nodes available) in a few seconds or minutes, depending on the length of the session analyzed. TRACK can use broadcast, ultra-rapid (containing predictions), rapid and final IGS orbits, thus making quasi-real-time processing possible, limited in practice by access to remote raw high-rate GPS data archives.
Since the only two stations recording 10 Hz data in the L’Aquila region are located close to the main shock epicenter and no such data were available at other sites in Italy, we built a new tool to generate a virtual far-field reference station acquiring 10 Hz data by interpolating the available 1 Hz RINEX data. The interpolated sites make it possible to properly solve the epoch-by-epoch position of the epicentral sites with the TRACK module. High-frequency GPS data are severely affected by multipath noise, which can reach the same magnitude as the co-seismic displacements and needs to be removed consistently. For this reason, we investigate the effect of time-wise and space-wise filters (sidereal and common-mode filters) and set up a Matlab tool to perform temporal and spatial filtering on the raw time series produced by our processing tool. High-rate data allow us to measure the true static co-seismic offsets, uncontaminated by the early afterslip that may occur in the hours following the earthquake. We analyze 10 Hz data from 2 stations (Fig. 1) belonging to the CAGEONET network (Anzidei et al., 2009), and 1 Hz data from 75 continuous GPS stations located in central, southern and northern Italy, for which data are available for 6 April. A data-quality inspection of the available high-rate RINEX files was used to select the reference station, and single-baseline solutions were then computed. We apply both spatial (common-mode) and temporal (sidereal) filters to improve the signal-to-noise ratio of the observed displacements and estimate the epoch and the static co-seismic offsets. 
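The temporal (sidereal) filter above exploits the fact that the GPS constellation geometry, and hence the multipath pattern, repeats roughly 246 s short of a solar day. A sketch of the idea, assuming position time series held as NumPy arrays at a constant sampling rate; the nominal repeat period is an assumption and is usually tuned per site:

```python
import numpy as np

GPS_REPEAT_S = 86154        # nominal orbit repeat period, seconds

def sidereal_filter(today, yesterday, rate_hz=1.0, repeat_s=GPS_REPEAT_S):
    """Subtract yesterday's series advanced by the orbit repeat period,
    so that the repeating multipath signal cancels."""
    shift = int(round((86400 - repeat_s) * rate_hz))   # samples to advance
    n = min(len(today), len(yesterday) - shift)
    return today[:n] - yesterday[shift:shift + n]
```

The common-mode (spatial) counterpart works analogously by subtracting a stack of far-field stations rather than a time-shifted copy of the same station.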
The 3D co-seismic displacement field has been used to invert for the best-fit rectangular dislocation geometry and fault slip distribution, using uniform-slip rectangular dislocations (Okada, 1985) embedded in an elastic, homogeneous and isotropic half-space and a constrained, non-linear optimization algorithm (Burgmann et al., 1997); the result has been compared with the fault geometry and slip model obtained from the analysis of standard 30-second, 24-hour data.

    Guidelines for remote programming and data download of GPS receivers (Trimble Navigation Limited standard – 5000 series).

    When implementing the hardware and software for the installation of permanent GPS stations, remote programming and data download via fixed-line modems (PSTN and ISDN) or mobile modems (essentially GSM/GPRS) is a problem with no easy solution. Even where the supplying companies offer software support for developing customized products, some of the required operations are often carried out by users with their own means. Indeed, the prevailing commercial tools operate mainly under Microsoft Windows and often lack the modularity and flexibility needed both to adapt to scientific problems and to run in the background under Unix environments (Linux, Solaris or HP-UX). This work analyzes the issues involved in programming Trimble Navigation Limited model 5700 CORS (Continuous Operating Reference Stations) receivers, operated by the Geodesy and Remote Sensing Laboratory of INGV-CNT, for use in permanent GPS stations. In particular, an automatic system has been developed for the remote control, programming and data download of these receivers. The entire hardware chain and software layer were built with tools we implemented ourselves; the result is sufficiently stable and can be used effectively to support the deployment of permanent GPS stations even at logistically difficult sites, such as remote areas without fixed telephone lines or mains power. 
For this type of instrumentation, in fact, no proper hardware and software support capable of solving such problems exists in Italy; in some cases it was therefore necessary to rely both on the technical support of UNAVCO, the American university consortium for NAVSTAR GPS (http://www.unavco.org), and on our own consolidated experience with GPS instrumentation. This document is aimed at the limited audience of specialists interested in the solutions to the problems analyzed here.
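Automated download procedures of this kind typically start from predictable daily file names. As a small illustration (the helper itself is hypothetical, but the naming follows the standard RINEX 2 short convention, ssssdddf.yyt):

```python
from datetime import date

def rinex_name(station, day, file_type="o", session="0"):
    """Short RINEX 2 file name: 4-char site id, 3-digit day of year,
    session character, 2-digit year, and type ('o' = observation)."""
    doy = day.timetuple().tm_yday
    return f"{station.lower():4s}{doy:03d}{session}.{day.year % 100:02d}{file_type}"

# e.g. the observation file for a hypothetical site BOLG on 6 April 2009:
name = rinex_name("BOLG", date(2009, 4, 6))   # 'bolg0960.09o'
```

A background download job can then iterate over stations and days, request each expected file over the modem link, and flag the gaps.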

    DESIGN AND IMPLEMENTATION OF A COMPUTER CLUSTER FOR GPS DATA ANALYSIS WITH THE GAMIT AND QOCA SOFTWARE

    Over the last five years the number of continuous GPS (CGPS) networks operating in Italy, and more generally in the Mediterranean area, has grown rapidly. While the development of CGPS networks for the study of geophysical phenomena (earthquakes, volcanoes, sea-level change, etc.) is still tied to specific research programs in the various countries of the Mediterranean basin, throughout Europe, and also in some areas of the African continent, CGPS networks have been established for purposes other than geophysics (cartography, surveying, cadastral work, navigation). CGPS networks built to "geophysical" standards [e.g., Anzidei & Esposito, 2003] generally provide more reliable data in terms of monument stability, data quality and temporal continuity of the observations; nevertheless, "non-geophysical" regional CGPS networks, despite an inevitably inhomogeneous distribution, have proven to provide information useful for estimating velocity and crustal strain fields [e.g., D'Agostino et al., 2008], and in most cases to integrate well with existing "geophysical" networks. In order to improve the spatial resolution of the tectonic signal measurable by a GPS network, we decided to build a computer cluster for GPS data analysis, so as to guarantee fast and, as far as possible, automatic processing of all the data available for the Euro-Mediterranean and African area. The software packages commonly used in the scientific community for GPS data analysis are GAMIT/GLOBK, BERNESE and GIPSY. 
Beyond the differences in the computational algorithms of the three packages, and the advantages and disadvantages of each approach, a correct design of the hardware and software infrastructure is the fundamental step in creating a modern and efficient GPS data analysis center that rationalizes resources and costs. Given the very large number of CGPS stations now potentially available (several hundred for the Mediterranean area alone), a procedure that analyzes all stations simultaneously is hardly practicable. Although recent algorithmic developments [Blewitt, 2008] do make the simultaneous analysis of "mega-networks" feasible, even at global scale, the availability of computing power on multi-processor systems remains essential. In the specific case where the analysis software is based on network solutions, as BERNESE and GAMIT are, the optimal exploitation of computational resources is of fundamental importance, and above all the ability to take full advantage of both recent multi-processor computers and the new multi-core processor architectures. None of the packages mentioned above is implemented for parallel computing; consequently, the exploitation of multi-processor or multi-core architectures must proceed by other means. One of these is distributed processing, in which, for example, different computing nodes (which may be different machines, different processors, or different processor cores) analyze different CGPS networks, or different days of the same CGPS network. 
While the market offers numerous commercial solutions for distributed computing (Microsoft Windows Compute Cluster Server 2003, Sun Cluster, NEC ExpressCluster, IBM Parallel Sysplex, to name a few), the open-source software available for this purpose is now mature and well integrated into UNIX-based operating systems. This technical report describes the procedure followed to build a new server for GPS data analysis at the INGV branch in Bologna, based on a computer cluster running Open Source software in a GNU/Linux environment.
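The day-wise distributed-processing pattern described above can be sketched in a few lines: each worker (standing in for a node, processor or core) runs one independent daily network solution. Here process_day is a placeholder; in a real cluster it would launch a GAMIT batch run for that day and return its exit status.

```python
from concurrent.futures import ThreadPoolExecutor

def process_day(doy):
    """Placeholder for the daily network solution of day-of-year `doy`."""
    return (doy, "ok")   # in practice: spawn the GAMIT run, collect status

def process_campaign(days, workers=4):
    """Farm the independent daily solutions out to a pool of workers."""
    # Threads suffice here because the real work is an external process;
    # a batch scheduler across cluster nodes follows the same pattern.
    with ThreadPoolExecutor(max_workers=workers) as ex:
        return dict(ex.map(process_day, days))
```

Because the daily solutions share no state, throughput scales with the number of available cores or nodes until disk and orbit-download I/O dominate.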

    A New Semi-Continuous GPS Network and Temporary Seismic Experiment Across the Montello-Conegliano Fault System (NE-Italy)

    The Montello–Conegliano Thrust is the most remarkable structure of the Southern Alpine fault belt in the Veneto–Friuli plain, owing to the conspicuous morphological evidence of the Montello anticline, which is associated with uplifted and deformed river terraces, diversion of the course of the Piave River, and vertical relative motions registered by leveling lines (Galadini et al., 2005; Burrato et al., 2008). Many papers have dealt with its geometry and evolution, and the presence of several orders of Middle and Upper Pleistocene warped river terraces (Benedetti et al., 2000) in the western sector strongly suggests that the Montello–Conegliano anticline is active and driven by the underlying thrust. However, in spite of the spectacular geomorphic and geologic evidence of activity of the Montello–Conegliano Thrust, there is little evidence on how much contractional strain is released through discrete events (i.e., earthquakes) and how much is released aseismically. Benedetti et al. (2000) hypothesized that the western part of the thrust (Montello) may have slipped three times in the past 2000 years (during the Mw 5.8 778 A.D., Mw 5.4 1268 and Mw 5.0 1859 earthquakes), yielding a mean recurrence time of about 500 years, whereas the eastern part of the thrust (Conegliano) would be silent. The Italian seismic catalogues have very poor-quality and incomplete data for the events associated with the Montello thrust, leaving room for different interpretations, such as the possibility that these earthquakes were generated by nearby secondary structures. In this latter case, the whole Montello–Conegliano Thrust would represent a major "silent" structure, with a recurrence interval longer than 700 years, because none of the historical earthquakes reported in the Italian catalogues of seismicity for the past seven centuries can be convincingly attributed to the Montello source. 
Given the uncertainties regarding the seismic potential of this segment of the Southern Alpine fault system, we designed and built a new GPS network across the Montello region (Fig. 1), with the goal of detecting the present-day velocity gradient pattern and developing models of the inter-seismic deformation (i.e., geometry, kinematics and coupling of the seismogenic fault). In 2009, we started implementing a new type of GPS experiment, called "semi-continuous". As the name suggests, the method involves moving a set of GPS receivers around a permanently installed network of monuments, such that each station is observed some fraction of the time. In practice, a set of GPS receivers can literally remain in the field for their entire life span, thus maximizing their usage. The monuments are designed with special mounts so that the GPS antenna is forced to the same physical location at each site. This has the advantage of mitigating errors (including possible blunders) in measuring the antenna height and in centering the antenna horizontally, and of reducing the variation in multipath bias from one occupation session to another. The length of each "session" depends on the design of the operations. At one extreme, some stations might act essentially as permanent stations (though the equipment remains highly mobile), thus providing a level of reference frame stability, while other stations may be occupied only every year or two, in order to extend or increase the density of a network's spatial coverage. In this work we present the motivations and tools used to develop and implement the new GPS network. During 2010 we will integrate the existing GPS network with 10 mobile seismic stations belonging to the INGV mobile network, with the goal of illuminating local micro-seismicity patterns that will help constrain the locked fault geometry.
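The occupation strategy just described amounts to a rotation plan: a few "backbone" monuments keep a receiver every session for reference-frame stability, while the spare receivers cycle through the remaining monuments. A purely illustrative round-robin sketch (names and counts are hypothetical):

```python
from itertools import cycle

def schedule(backbone, rotating, receivers, sessions):
    """Assign receivers to monuments for each session.

    backbone  : monuments occupied in every session
    rotating  : monuments shared by the spare receivers, in rotation
    receivers : total number of receivers available
    sessions  : number of sessions to plan
    """
    spare = receivers - len(backbone)      # receivers free to rotate
    rot = cycle(rotating)
    return [list(backbone) + [next(rot) for _ in range(spare)]
            for _ in range(sessions)]
```

Over enough sessions every monument accumulates observation time while no receiver ever sits idle, which is the stated point of the semi-continuous design.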